The development of audiovisual speech perception
Authors
Abstract
The seemingly effortless and incidental way in which infants acquire and perfect the ability to use spoken language(s) is quite remarkable considering the complexity of human linguistic systems. Parsing the spoken signal successfully is in fact anything but trivial, as attested by the challenge posed by learning a new language later in life, let alone attaining native-like proficiency. It is now widely accepted that (adult) listeners tend to make use of as many cues as are available to them in order to decode the spoken signal effectively (e.g. Cutler 1997; Soto-Faraco et al. 2001). These sources of information not only span various linguistic levels (acoustic, phonological, lexical, morphological, syntactic, semantic, pragmatic, etc.), but also encompass different sensory modalities such as vision and audition (Campbell et al. 1998) and even touch (see Gick and Derrick 2009). Thus, like most everyday perceptual experiences (Gibson 1966; Lewkowicz 2000a; Stein and Meredith 1993), spoken communication involves multisensory inputs. In the particular case of speech, the receiver in a typical face-to-face conversation has access to the sounds as well as to the corresponding visible articulatory gestures made by the speaker. The integration of heard and seen speech information has received a good deal of attention in the literature, and its consequences have been repeatedly documented at the behavioral (e.g. Ma et al. 2009; McGurk and MacDonald 1976; Ross et al. 2007; Sumby and Pollack 1954) as well as the physiological (e.g. Calvert et al. 2000) level. The focus of the present chapter is on the developmental course of the multisensory processing mechanisms that facilitate language acquisition, and on the contribution of multisensory information to the development of speech perception. We argue that the plastic mechanisms leading to the acquisition of a language, whether in infancy or later on, are sensitive to the correlated and often complementary nature of multiple crossmodal sources of linguistic information. These crossmodal correspondences are used to decode and represent the speech signal from the first months of life.
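To make the cue-combination claim concrete, here is a minimal Python sketch of reliability-weighted (maximum-likelihood) fusion of an auditory and a visual estimate of the same speech feature, one standard formal model of multisensory integration in the spirit of the Bayesian account cited above (Ma et al. 2009). The function name, the Gaussian-cue assumption, and all numeric values are illustrative assumptions, not the chapter's own model or data.

# Reliability-weighted combination of two noisy cues to the same speech
# feature. Illustrative sketch only: names and values are assumptions.

def combine_cues(mu_a, sigma_a, mu_v, sigma_v):
    """Fuse an auditory estimate (mu_a, sigma_a) and a visual estimate
    (mu_v, sigma_v), each modeled as a Gaussian, weighting each cue by
    its reliability (inverse variance)."""
    w_a = 1.0 / sigma_a ** 2                  # auditory reliability
    w_v = 1.0 / sigma_v ** 2                  # visual reliability
    mu_av = (w_a * mu_a + w_v * mu_v) / (w_a + w_v)
    sigma_av = (1.0 / (w_a + w_v)) ** 0.5     # fused estimate is less noisy than either cue
    return mu_av, sigma_av

# Example: in acoustic noise the auditory cue is unreliable (large sigma),
# so the fused percept is dominated by the visual (lip-read) estimate,
# mirroring the visual gain in noise reported by Sumby and Pollack (1954).
mu, sigma = combine_cues(mu_a=500.0, sigma_a=80.0, mu_v=560.0, sigma_v=20.0)
print(f"combined estimate: {mu:.1f} (sd {sigma:.1f})")   # -> 556.5 (sd 19.4)

Because the weights track cue reliability, this scheme predicts that visual influence grows as the auditory signal degrades, which is the pattern the behavioral literature above describes.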
Similar Resources
The development of face perception in infancy: intersensory interference and unimodal visual facilitation.
Although research has demonstrated impressive face perception skills of young infants, little attention has focused on conditions that enhance versus impair infant face perception. The present studies tested the prediction, generated from the intersensory redundancy hypothesis (IRH), that face discrimination, which relies on detection of visual featural information, would be impaired in the con...
The development of sensorimotor influences in the audiovisual speech domain: some critical questions
Speech researchers have long been interested in how auditory and visual speech signals are integrated, and recent work has revived interest in the role of speech production with respect to this process. Here, we discuss these issues from a developmental perspective. Because speech perception abilities typically outstrip speech production abilities in infancy and childhood, it is unclear how...
Audiovisual speech integration and lipreading in autism.
BACKGROUND During speech perception, the ability to integrate auditory and visual information causes speech to sound louder and be more intelligible, and leads to quicker processing. This integration is important in early language development, and also continues to affect speech comprehension throughout the lifespan. Previous research shows that individuals with autism have difficulty integrati...
Teaching and learning guide for audiovisual speech perception: A new approach and implications for clinical populations
When a speaker talks, the visible consequences of what they are saying can be seen. This auditory (the speech sound) and visual (movements of the lips and other articulators) information, or AV speech, influences what listeners hear both in noisy listening environments and when auditory speech can easily be heard. Thought to be a cross-cultural phenomenon that emerges early in typical language development, ...
Lip movements affect infants' audiovisual speech perception.
Speech is robustly audiovisual from early in infancy. Here we show that audiovisual speech perception in 4.5-month-old infants is influenced by sensorimotor information related to the lip movements they make while chewing or sucking. Experiment 1 consisted of a classic audiovisual matching procedure, in which two simultaneously displayed talking faces (visual [i] and [u]) were presented with a ...
Language/Culture Modulates Brain and Gaze Processes in Audiovisual Speech Perception
Several behavioural studies have shown that the interplay between voice and face information in audiovisual speech perception is not universal. Native English speakers (ESs) are influenced by visual mouth movement to a greater degree than native Japanese speakers (JSs) when listening to speech. However, the biological basis of these group differences is unknown. Here, we demonstrate the time-va...
Publication date: 2012